28 advocacy groups call on Apple and Google to ban Grok, X over nonconsensual deepfakes

Engadget

On Wednesday, a coalition of women's and progressive advocacy groups called on Tim Cook and Sundar Pichai to uphold their own rules and remove the apps. The two (frequently virtue-signaling) companies have inexplicably allowed Grok and X to remain in their app stores -- even as Musk's chatbot reportedly continues to produce the material. The open letters to Apple and Google were signed by 28 groups, among them the women's advocacy group Ultraviolet, the parents' group ParentsTogether Action and the National Organization for Women. Neither company has responded to Engadget's request for comment.


Spreading AI-generated content could lead to expensive fines

Popular Science

AI-generated "deepfake" materials are flooding the internet, sometimes with dangerous results. In just the last year, AI has been used to make deceptive voice clones of a former US president and to spread fake, politically charged images depicting children in natural disasters. Nonconsensual, AI-generated sexual images and videos, meanwhile, are leaving a trail of trauma affecting everyone from high schoolers to Taylor Swift. Large tech companies like Microsoft and Meta have made some efforts to identify instances of AI manipulation, but with only muted success. Now, governments are stepping in to try to stem the tide with something they know quite a bit about: fines.


The US Needs Deepfake Porn Laws. These States Are Leading the Way

WIRED

As national legislation on deepfake pornography crawls its way through Congress, states across the country are trying to take matters into their own hands. Thirty-nine states have introduced a hodgepodge of laws designed to deter the creation of nonconsensual deepfakes and punish those who make and share them. Earlier this year, Democratic congresswoman Alexandria Ocasio-Cortez, herself a victim of nonconsensual deepfakes, introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act, or Defiance Act. If passed, the bill would allow victims of deepfake pornography to sue as long as they could prove the deepfakes had been made without their consent. In June, Republican senator Ted Cruz introduced the Take It Down Act, which would require platforms to remove both revenge porn and nonconsensual deepfake porn.


Three ways we can fight deepfake porn

MIT Technology Review

Of all types of harm related to generative AI, nonconsensual deepfakes affect the largest number of people, with women making up the vast majority of those targeted, says Henry Ajder, an AI expert who specializes in generative AI and synthetic media. Thankfully, there is some hope. New tools and laws could make it harder for attackers to weaponize people's photos, and they could help us hold perpetrators accountable. Here are three ways we can combat nonconsensual deepfake porn. Social media platforms sift through the posts that are uploaded onto their sites and take down content that goes against their policies.


If Taylor Swift Can't Defeat Deepfake Porn, No One Can

WIRED

If anyone can rally a base, it's Taylor Swift. When sexually explicit, likely AI-generated images of Swift circulated on social media this week, it galvanized her fans. Swifties found phrases and hashtags related to the images and flooded them with videos and photos of Swift performing. "Protect Taylor Swift" went viral, trending as Swifties spoke out against not just the Swift deepfakes, but all nonconsensual, explicit images made of women. Swift, arguably the most famous woman in the world right now, has become a high-profile victim of an all-too-frequent form of harassment.


Porn Sites Still Won't Take Down Nonconsensual Deepfakes

WIRED

Hundreds of explicit deepfake videos featuring female celebrities, actresses, and musicians are being uploaded to the world's biggest pornography websites every month, new analysis shows. The nonconsensual videos rack up millions of views, and porn companies are still failing to remove them from their websites. This story originally appeared on WIRED UK. Up to 1,000 deepfake videos have been uploaded to porn sites every month as they became increasingly popular during 2020, figures from deepfake detection company Sensity show. The videos continue to break away from dedicated deepfake pornography communities and into the mainstream.